A Modified Activation Function with Improved Run-Times For Neural Networks

Authors

  • Vincent Ike Anireh
  • Emmanuel N. Osegi
Abstract

In this paper we present a modified version of the hyperbolic tangent activation function as a learning-unit generator for neural networks. The function uses an integer calibration constant as an approximation to the Euler number, e, based on a quadratic Real Number Formula (RNF) algorithm, together with an adaptive normalization constraint on the input activations to avoid the vanishing gradient. We demonstrate the effectiveness of the proposed modification on a hypothetical and a real-world dataset, and show that learning algorithms using this function achieve lower run-times, leading to improved speed-ups and learning accuracies during training.

Key Terms: Adaptive Normalization, Hyperbolic Tangent Activation Function, Neural Networks, Real Number Formula, Vanishing Gradient Problem

N.E. Osegi, System Analytics Laboratories (SAL), Sure-GP Ltd, Port-Harcourt, Rivers State, Nigeria. E-mail: [email protected]
V.I.E. Anireh, Department of Computer Science, Rivers State University of Science and Technology, Port-Harcourt, Rivers State, Nigeria. E-mail: [email protected]
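The abstract's two ingredients, an approximated base in place of e and a normalization of the input activations, can be illustrated with a minimal sketch. This is not the paper's algorithm: `E_APPROX` is a hypothetical stand-in for the RNF-calibrated constant, and the max-magnitude scaling is an assumed form of the adaptive normalization constraint.

```python
import numpy as np

# Assumption: placeholder for the paper's RNF-derived calibration constant;
# the actual quadratic Real Number Formula is not reproduced here.
E_APPROX = 2.718

def mod_tanh(x, eps=1e-8):
    """Hedged sketch of a tanh-style activation with an approximated base.

    Inputs are rescaled by their maximum magnitude (assumed normalization)
    so pre-activations stay in a range where the tanh gradient does not
    vanish; the hyperbolic tangent is then built from E_APPROX rather
    than np.exp.
    """
    x = np.asarray(x, dtype=float)
    scale = np.max(np.abs(x)) + eps  # adaptive normalization (assumed form)
    z = x / scale
    p = E_APPROX ** z    # approximate e**z
    n = E_APPROX ** -z   # approximate e**-z
    return (p - n) / (p + n)
```

Because the inputs are normalized into [-1, 1] before the tanh form is applied, the outputs remain strictly inside (-1, 1) and the saturating tails of tanh are never reached, which is one plausible reading of how the constraint sidesteps the vanishing gradient.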


Similar Resources

Implementation of a programmable neuron in CNTFET technology for low-power neural networks

A circuit-level implementation of a novel neuron is discussed in this article. A low-power Activation Function (AF) circuit is introduced, which is then combined with a highly linear synapse circuit to form the neuron architecture. Designed in Carbon Nanotube Field-Effect Transistor (CNTFET) technology, the proposed structure consumes low power, which makes it suitable for the...


Handwritten Character Recognition using Modified Gradient Descent Technique of Neural Networks and Representation of Conjugate Descent for Training Patterns

The purpose of this study is to analyze the performance of the backpropagation algorithm with changing training patterns and a second momentum term in feed-forward neural networks. The analysis is conducted on 250 different words of three small letters from the English alphabet. These words are presented to two vertical segmentation programs which are designed in MATLAB and based on portions (1...


Application of Wavelet Neural Networks for Improving of Ionospheric Tomography Reconstruction over Iran

In this paper, a new method of ionospheric tomography is developed and evaluated based on neural networks (NN). This new method is named ITNN. In this method, a wavelet neural network (WNN) with a particle swarm optimization (PSO) training algorithm is used to solve some of the ionospheric tomography problems. The results of the ITNN method are compared with the residual minimization training neura...


Image Backlight Compensation Using Recurrent Functional Neural Fuzzy Networks Based on Modified Differential Evolution

In this study, an image backlight compensation method using adaptive luminance modification is proposed for efficiently obtaining clear images. The proposed method combines the fuzzy C-means clustering method, a recurrent functional neural fuzzy network (RFNFN), and a modified differential evolution. The proposed RFNFN is based on the two backlight factors that can accurately detect the compensat...


A Nonlinear Model of Economic Data Related to the German Automobile Industry

Prediction of economic variables is a basic component not only of economic models but also of many business decisions. It is, however, difficult to produce accurate predictions in times of economic crises, which cause nonlinear effects in the data. Such evidence appeared in the German automobile industry as a consequence of the financial crisis of 2008/09, which influenced exchange rates and a...



Journal:
  • CoRR

Volume: abs/1607.01691  Issue: -

Pages: -

Publication date: 2016